How Much Randomness Can Be Extracted from Memoryless Shannon Entropy Sources?

Author

  • Maciej Skorski
Abstract

We revisit the classical problem: given a memoryless source with a certain amount of Shannon entropy, how many random bits can be extracted? This question appears in works studying random number generators built from physical entropy sources. Some authors use a heuristic estimate obtained from the Asymptotic Equipartition Property (AEP), which yields roughly n extractable bits, where n is the total amount of Shannon entropy. However, the best known precise form gives only n − O(√(n log(1/ε))), where ε is the distance of the extracted bits from uniform. In this paper we show a matching upper bound of n − Ω(√(n log(1/ε))). Therefore, a loss of Θ(√(n log(1/ε))) bits is necessary. As we show, this theoretical bound is of practical relevance: applying the imprecise AEP heuristic to a mobile-phone accelerometer, one might overestimate the extractable entropy by as much as 100%, no matter what the extractor is. Thus, the "AEP extracting heuristic" should not be used without taking the precise error into account.
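To make the size of this loss concrete, here is a minimal Python sketch (not from the paper: the constant c in front of the √(n log(1/ε)) term and the example parameters are illustrative assumptions, since the abstract only fixes the order of growth) comparing the naive AEP estimate with the corrected bound:

```python
import math

def aep_heuristic(n: float) -> float:
    """Naive AEP estimate: all n bits of Shannon entropy are extractable."""
    return n

def corrected_bound(n: float, eps: float, c: float = 1.0) -> float:
    """Extractable bits after the necessary Theta(sqrt(n*log(1/eps))) loss.
    The constant c is an assumption; the paper fixes only the order of growth."""
    return n - c * math.sqrt(n * math.log2(1.0 / eps))

# Illustrative numbers: 256 bits of total Shannon entropy, output
# required to be 2^-80-close to uniform.
n, eps = 256.0, 2.0 ** -80
print(f"AEP heuristic  : {aep_heuristic(n):.1f} bits")
print(f"corrected bound: {corrected_bound(n, eps):.1f} bits")
# The heuristic overestimates by roughly sqrt(80 * 256) ~ 143 bits here,
# i.e. by more than 100% of what is actually extractable.
```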


Similar resources

A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy

We provide a new result that links two crucial entropy notions: Shannon entropy H_1 and collision entropy H_2. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results and techniques used in the proof immediately imply many quantitatively tight separations between Shannon and smooth Renyi entropy, which were pre...
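As a rough illustration of such a separation (a minimal sketch, not taken from the paper; the heavy-atom mixture is a standard example), one can compare the two entropies on a distribution with a single heavy point and a uniform tail:

```python
import math

def shannon_entropy(p):
    """H_1(P) = -sum p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def collision_entropy(p):
    """H_2(P) = -log2(sum p_i^2), in bits."""
    return -math.log2(sum(x * x for x in p))

# One heavy atom of mass b plus a uniform tail over m points: Shannon
# entropy stays high while collision entropy collapses toward -log2(b^2).
b, m = 0.5, 2 ** 20
p = [b] + [(1 - b) / m] * m
print(f"H_1 = {shannon_entropy(p):.2f} bits")    # ~ 11.0 bits
print(f"H_2 = {collision_entropy(p):.2f} bits")  # ~ 2.0 bits
```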


Shannon Entropy Versus Renyi Entropy from a Cryptographic Viewpoint

We provide a new inequality that links two important entropy notions: Shannon entropy H_1 and collision entropy H_2. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. While in practice it is easier to evaluate Shannon entropy than other entropy notions, it is well known in folklore that it does not provide a good est...


Regularities unseen, randomness observed: levels of entropy convergence.

We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of appa...
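The "successive derivatives" idea can be sketched numerically (an illustrative Python example, not code from that paper): estimate the block-entropy growth curve H(L) from a sample and take its discrete derivative h(L) = H(L) − H(L−1), which converges to the entropy rate.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Empirical Shannon entropy H(L) of length-L blocks, in bits."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two-state Markov chain with switching probability p; its entropy rate
# is the binary entropy h(p) ~ 0.881 bits/symbol for p = 0.3.
random.seed(0)
p, n = 0.3, 200_000
seq, state = [], 0
for _ in range(n):
    if random.random() < p:
        state ^= 1
    seq.append(state)

h_rate = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
prev = 0.0
for L in range(1, 7):
    H = block_entropy(seq, L)
    print(f"L={L}: H(L)={H:.3f}  h(L)=H(L)-H(L-1)={H - prev:.3f}")
    prev = H
print(f"true entropy rate h(p) = {h_rate:.3f} bits/symbol")
```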


MATH 3989 Mathematics Project: NON-ASYMPTOTIC EQUIPARTITION PROPERTIES FOR HIDDEN MARKOV PROCESSES

A hidden Markov process (HMP) is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. The non-asymptotic equipartition property (NEP) is a bound on the probability of the sample entropy deviating from the entropy rate of a stochastic process, so it can be viewed as a refinement of the Shannon-McMillan-Breiman theorem. In this report, w...


Tsallis Entropy and Conditional Tsallis Entropy of Fuzzy Partitions

The purpose of this study is to define the concepts of Tsallis entropy and conditional Tsallis entropy of fuzzy partitions and to obtain some results concerning this kind of entropy. We show that the Tsallis entropy of fuzzy partitions has the subadditivity and concavity properties. We study this information measure under the refinement and zero mode subset relations. We check the chain rules for ...
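For reference, a minimal sketch of the classical (non-fuzzy) Tsallis entropy, assuming the standard definition H_q(P) = (1 − Σ p_i^q)/(q − 1); the fuzzy-partition variant defined in that paper is not reproduced here:

```python
def tsallis_entropy(p, q: float) -> float:
    """Classical Tsallis entropy H_q(P) = (1 - sum p_i^q) / (q - 1), q != 1.
    As q -> 1 it recovers Shannon entropy (in nats)."""
    if q == 1.0:
        raise ValueError("q = 1 is the Shannon limit; use q != 1")
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

# The uniform distribution maximizes Tsallis entropy for every q > 0.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(tsallis_entropy(uniform, q=2.0))  # 0.75
print(tsallis_entropy(skewed, q=2.0))   # 0.48
```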



Journal:
  • IACR Cryptology ePrint Archive

Volume 2015, Issue -

Pages -

Publication date: 2015